Path: blob/master/Part 10 - Model Selection And Boosting/Grid Search/[R] Grid Search.ipynb
Kernel: R
Grid Search
Data preprocessing
In [1]:
In [2]:
In [3]:
In [4]:
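The contents of the four preprocessing cells were not preserved in this export. A hedged sketch of the usual sequence for this notebook, assuming the course's Social_Network_Ads.csv dataset (columns User.ID, Gender, Age, EstimatedSalary, Purchased — the file name and column layout are assumptions, not confirmed by this export):

```r
# Hypothetical reconstruction -- the original cell code was lost in export.
# Assumes Social_Network_Ads.csv with Age, EstimatedSalary, Purchased in columns 3-5.
dataset = read.csv('Social_Network_Ads.csv')
dataset = dataset[3:5]

# Encode the target as a factor with levels 0 and 1
dataset$Purchased = factor(dataset$Purchased, levels = c(0, 1))

# Split into training (75%) and test (25%) sets
library(caTools)
set.seed(123)
split = sample.split(dataset$Purchased, SplitRatio = 0.75)
training_set = subset(dataset, split == TRUE)
test_set = subset(dataset, split == FALSE)

# Feature scaling (columns 1-2 are the predictors, column 3 the label)
training_set[-3] = scale(training_set[-3])
test_set[-3] = scale(test_set[-3])
```

With 400 rows, a 75% split yields the 320 training samples that caret reports later in the notebook.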
Applying Grid Search to find the best model and the best parameters
In [5]:
Out[5]:
Loading required package: lattice
Loading required package: ggplot2
Attaching package: ‘kernlab’
The following object is masked from ‘package:ggplot2’:
alpha
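The package-loading messages above are consistent with a cell that loads caret and trains an RBF SVM over caret's default tuning grid (the exact cell code was lost; this is a sketch under that assumption):

```r
# Sketch: grid search via caret's train(); method = 'svmRadial' matches the
# "Support Vector Machines with Radial Basis Function Kernel" output below.
library(caret)
classifier = train(form = Purchased ~ .,
                   data = training_set,
                   method = 'svmRadial')  # tunes C over a default grid;
                                          # sigma is estimated once and held constant
classifier            # prints the resampling summary
classifier$bestTune   # best sigma and C found by the search
```

By default caret resamples with 25 bootstrap repetitions and picks the parameters with the highest accuracy, which matches the output shown below.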
In [6]:
Out[6]:
Support Vector Machines with Radial Basis Function Kernel
320 samples
2 predictor
2 classes: '0', '1'
No pre-processing
Resampling: Bootstrapped (25 reps)
Summary of sample sizes: 320, 320, 320, 320, 320, 320, ...
Resampling results across tuning parameters:
C Accuracy Kappa
0.25 0.9058640 0.7969638
0.50 0.9065979 0.7987936
1.00 0.9054507 0.7962180
Tuning parameter 'sigma' was held constant at a value of 1.599667
Accuracy was used to select the optimal model using the largest value.
The final values used for the model were sigma = 1.599667 and C = 0.5.
In [7]:
Out[7]:
Fitting classifier to the Training set
In [8]:
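The fitting cell's code was not preserved. A sketch using e1071, as the course does elsewhere (an assumption; note that e1071 parameterises the RBF kernel with gamma/cost rather than caret's sigma/C):

```r
# Sketch -- kernel = 'radial' matches the grid-search winner above.
library(e1071)
classifier = svm(formula = Purchased ~ .,
                 data = training_set,
                 type = 'C-classification',
                 kernel = 'radial')
# To plug in the tuned values explicitly, one could pass
# gamma = 1.599667, cost = 0.5 (mapping caret's sigma/C to e1071's names).
```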
Predicting the Test set results
In [9]:
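A sketch of the prediction step (assuming the `classifier` and scaled `test_set` from the cells above, with the label in column 3):

```r
# Predict class labels for the scaled test-set features
y_pred = predict(classifier, newdata = test_set[-3])
```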
In [10]:
Out[10]:
In [11]:
Out[11]:
Applying k-Fold Cross Validation
In [12]:
In [13]:
In [14]:
Out[14]:
In [15]:
Out[15]:
In [16]:
Out[16]:
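The cross-validation cells lost their code in this export. A sketch of the usual 10-fold scheme, assuming caret's `createFolds` and the e1071 classifier from above:

```r
# Sketch: 10-fold cross-validation on the training set
library(caret)
library(e1071)
set.seed(123)
folds = createFolds(training_set$Purchased, k = 10)
cv = lapply(folds, function(x) {
  training_fold = training_set[-x, ]
  test_fold = training_set[x, ]
  classifier = svm(formula = Purchased ~ .,
                   data = training_fold,
                   type = 'C-classification',
                   kernel = 'radial')
  y_pred = predict(classifier, newdata = test_fold[-3])
  cm = table(test_fold[, 3], y_pred)
  (cm[1, 1] + cm[2, 2]) / sum(cm)  # fold accuracy
})
mean(as.numeric(cv))  # mean CV accuracy (estimates bias)
sd(as.numeric(cv))    # spread across folds (estimates variance)
```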
The mean cross-validated accuracy is high and varies little across folds, which places the model in the low-bias, low-variance region of the bias-variance trade-off.
Making the Confusion Matrix
In [17]:
Out[17]:
   y_pred
     0  1
  0 45  6
  1  4 25
The classifier made 45 + 25 = 70 correct predictions and 6 + 4 = 10 incorrect predictions, for a test-set accuracy of 70/80 = 87.5%.
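That arithmetic can be checked directly in base R by rebuilding the matrix above by hand:

```r
# Confusion matrix from the output above (rows: actual, columns: predicted)
cm = matrix(c(45, 4, 6, 25), nrow = 2,
            dimnames = list(actual = c('0', '1'), y_pred = c('0', '1')))
correct   = sum(diag(cm))       # 45 + 25 = 70
incorrect = sum(cm) - correct   # 6 + 4 = 10
accuracy  = correct / sum(cm)   # 70 / 80 = 0.875
```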
Visualising the Training set results
In [18]:
Out[18]:
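The plotting code was lost in export. A sketch of the grid-based decision-boundary plot the course uses (assuming predictor names Age and EstimatedSalary, and the `classifier` from above):

```r
# Sketch: classify a dense grid of points and colour the decision regions
set = training_set   # swap in test_set to reproduce the test-set plot below
X1 = seq(min(set[, 1]) - 1, max(set[, 1]) + 1, by = 0.01)
X2 = seq(min(set[, 2]) - 1, max(set[, 2]) + 1, by = 0.01)
grid_set = expand.grid(X1, X2)
colnames(grid_set) = c('Age', 'EstimatedSalary')   # assumed feature names
y_grid = predict(classifier, newdata = grid_set)
plot(set[, -3],
     main = 'Kernel SVM (Training set)',
     xlab = 'Age', ylab = 'Estimated Salary',
     xlim = range(X1), ylim = range(X2))
contour(X1, X2, matrix(as.numeric(y_grid), length(X1), length(X2)), add = TRUE)
points(grid_set, pch = '.', col = ifelse(y_grid == 1, 'springgreen3', 'tomato'))
points(set, pch = 21, bg = ifelse(set[, 3] == 1, 'green4', 'red3'))
```

Rerunning the same block with `set = test_set` produces the test-set visualisation in the next section.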
Visualising the Test set results
In [19]:
Out[19]:
The decision boundary is clearly non-linear, and the RBF kernel performs noticeably better on this dataset than the linear kernel did.